Causal discovery with continuous additive noise models
Authors
Abstract
We consider the problem of learning causal directed acyclic graphs from an observational joint distribution. One can use these graphs to predict the outcome of interventional experiments, from which data are often not available. We show that if the observational distribution follows a structural equation model with an additive noise structure, the directed acyclic graph becomes identifiable from the distribution under mild conditions. This constitutes an interesting alternative to traditional methods that assume faithfulness and identify only the Markov equivalence class of the graph, thus leaving some edges undirected. We provide practical algorithms for finitely many samples: RESIT (regression with subsequent independence test) and two methods based on an independence score. We prove that RESIT is correct in the population setting and provide an empirical evaluation.
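As a rough illustration of the RESIT idea mentioned in the abstract (repeatedly regress each candidate sink variable on the remaining variables and test whether its residual is independent of them), the following Python sketch implements the ordering phase. It is not the authors' implementation: the gradient-boosted regression and the biased HSIC statistic are stand-ins for whichever nonparametric regression method and independence measure one prefers, and the edge-pruning phase of the full algorithm is omitted.

```python
# Minimal sketch (not the paper's implementation) of the RESIT ordering phase.
# Assumptions: NumPy and scikit-learn are available; gradient boosting and a
# biased HSIC statistic stand in for the user's preferred regression method
# and independence measure.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor


def rbf_gram(x, sigma=None):
    """Gaussian-kernel Gram matrix with a median-heuristic bandwidth."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    d2 = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    if sigma is None:
        positive = d2[d2 > 0]
        sigma = np.sqrt(0.5 * np.median(positive)) if positive.size else 1.0
    return np.exp(-d2 / (2.0 * sigma ** 2))


def hsic(x, y):
    """Biased HSIC estimate: values near zero indicate (near-)independence."""
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(H @ rbf_gram(x) @ H @ rbf_gram(y)) / n ** 2


def resit_order(X):
    """Return a causal order (causes first) for a data matrix X of shape (n, d)."""
    remaining = list(range(X.shape[1]))
    order = []
    while len(remaining) > 1:
        dependence = {}
        for j in remaining:
            others = [k for k in remaining if k != j]
            reg = GradientBoostingRegressor().fit(X[:, others], X[:, j])
            resid = X[:, j] - reg.predict(X[:, others])
            dependence[j] = hsic(resid, X[:, others])
        sink = min(dependence, key=dependence.get)  # most independent residual
        order.insert(0, sink)                       # sinks end up last
        remaining.remove(sink)
    return [remaining[0]] + order
```

On data generated from an additive noise model, resit_order should place every variable after its causes; the paper's full method additionally prunes superfluous edges, which this sketch does not attempt.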
Similar articles
Nonlinear causal discovery with additive noise models
The discovery of causal relationships between a set of observed variables is a fundamental problem in science. For continuous-valued data, linear acyclic causal models with additive noise are often used because these models are well understood and there are well-known methods to fit them to data. In reality, of course, many causal relationships are more or less nonlinear, raising some doubts as ...
On Causal Discovery with Cyclic Additive Noise Models
We study a particular class of cyclic causal models, where each variable is a (possibly nonlinear) function of its parents and additive noise. We prove that the causal graph of such models is generically identifiable in the bivariate, Gaussian-noise case. We also propose a method to learn such models from observational data. In the acyclic case, the method reduces to ordinary regression, but in...
Justifying Additive Noise Model-Based Causal Discovery via Algorithmic Information Theory
A recent method for causal discovery is in many cases able to infer whether X causes Y or Y causes X for just two observed variables X and Y. It is based on the observation that there exist (non-Gaussian) joint distributions P(X, Y) for which Y may be written as a function of X up to an additive noise term that is independent of X, and no such model exists from Y to X. Whenever this is the ca...
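Concretely, the asymmetry described above can be checked on a finite sample by fitting a nonlinear regression in both directions and comparing how independent the residuals are of the putative cause. The sketch below is hypothetical: it reuses the hsic() helper and regressor from the RESIT sketch above, and the function name anm_direction and the toy data are illustrative only.

```python
# Hypothetical bivariate direction test in the spirit of the observation above.
# Reuses hsic() and GradientBoostingRegressor from the RESIT sketch; the toy
# data and the function name anm_direction are illustrative only.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor


def anm_direction(x, y):
    """Compare residual dependence for the fits y ~ f(x) and x ~ g(y)."""
    def residual_dependence(cause, effect):
        reg = GradientBoostingRegressor().fit(cause.reshape(-1, 1), effect)
        resid = effect - reg.predict(cause.reshape(-1, 1))
        return hsic(resid, cause)
    return "x->y" if residual_dependence(x, y) < residual_dependence(y, x) else "y->x"


# Toy check: data generated as Y = X**3 + noise should usually favour x->y.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 300)
y = x ** 3 + 0.1 * rng.uniform(-1.0, 1.0, 300)
print(anm_direction(x, y))  # expected on most samples: x->y
```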
Dependence Minimizing Regression with Model Selection for Non-Linear Causal Inference under Non-Gaussian Noise
The discovery of non-linear causal relationships under additive non-Gaussian noise models has attracted considerable attention recently because of their high flexibility. In this paper, we propose a novel causal inference algorithm called least-squares independence regression (LSIR). LSIR learns the additive noise model through minimization of an estimator of the squared-loss mutual information b...
Journal: Journal of Machine Learning Research
Volume: 15
Pages: -
Publication date: 2014